1.
Foods ; 13(7)2024 Apr 04.
Article in English | MEDLINE | ID: mdl-38611415

ABSTRACT

A review of quantitative risk assessment (QRA) models of Listeria monocytogenes in produce was carried out, with the objective of appraising and contrasting the effectiveness of the control strategies placed along the food chain. Although nine of the thirteen QRA models retrieved focused on fresh or ready-to-eat (RTE) leafy greens, none of them represented important factors or sources of contamination at primary production, such as the type of cultivation, water, fertilisers or irrigation method/practices. Cross-contamination at processing and during consumer handling was modelled using transfer rates, which were shown to moderately drive the final risk of listeriosis, highlighting the importance of accurately representing the transfer coefficient parameters. Many QRA models agreed that temperature fluctuations at retail or temperature abuse at home were key factors contributing to an increased risk of listeriosis. In addition to a primary production module that could help assess current on-farm practices and potential control measures, future QRA models for minimally processed produce should also contain a refined sanitisation module able to estimate the effectiveness of various sanitisers as a function of type, concentration and exposure time. Finally, L. monocytogenes growth in the products down the supply chain should be estimated using realistic time-temperature trajectories and validated microbial kinetic parameters, both of which are currently available in the literature.
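
As a purely illustrative sketch of the last recommendation (not code from any of the reviewed models), the snippet below estimates L. monocytogenes growth along a hypothetical time-temperature trajectory, combining a square-root-type secondary model with exponential growth capped at a maximum population density; all parameter values are assumptions chosen for illustration.

```python
# Assumed illustrative parameters (not taken from any specific reviewed QRA model)
B = 0.023          # square-root (Ratkowsky-type) slope -- assumed
T_MIN = -1.5       # notional minimum growth temperature, degC -- assumed
N0_LOG = 1.0       # initial concentration, log10 CFU/g -- assumed
NMAX_LOG = 8.0     # maximum population density, log10 CFU/g -- assumed

def sqrt_model_mu(temp_c):
    """Square-root secondary model: growth rate in log10 CFU/g per hour."""
    if temp_c <= T_MIN:
        return 0.0
    return (B * (temp_c - T_MIN)) ** 2

def grow_along_trajectory(times_h, temps_c, n0_log=N0_LOG, nmax_log=NMAX_LOG):
    """Integrate exponential growth over a piecewise time-temperature profile."""
    n_log = n0_log
    for (t0, t1), temp in zip(zip(times_h[:-1], times_h[1:]), temps_c[:-1]):
        mu = sqrt_model_mu(temp)                        # rate for this segment
        n_log = min(n_log + mu * (t1 - t0), nmax_log)   # cap at maximum density
    return n_log

# Hypothetical retail storage (4 degC, 48 h) followed by domestic abuse (8 degC, 72 h)
times = [0, 48, 120]
temps = [4.0, 8.0, 8.0]
print(f"Final concentration: {grow_along_trajectory(times, temps):.2f} log10 CFU/g")
```

Swapping in validated kinetic parameters and observed cold-chain trajectories, as the review recommends, would only require replacing the assumed constants and the profile.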

3.
Foods ; 13(5)2024 Feb 29.
Article in English | MEDLINE | ID: mdl-38472864

ABSTRACT

Better knowledge of the Listeria monocytogenes dose-response (DR) model is needed to refine the assessment of the risk of foodborne listeriosis. In 2018, the European Food Safety Authority (EFSA) derived a lognormal-Poisson DR model for 14 different age-sex sub-groups, marginalised over strain virulence. In the present study, new sets of parameters are developed by integrating the EFSA model for these sub-groups with three classes of strain virulence ("less virulent", "virulent", and "more virulent"). Considering virulence classes leads to estimated relative risks (RRs) of listeriosis following the ingestion of 1000 bacteria of "more virulent" vs. "less virulent" strains ranging from 21.6 to 24.1, depending on the sub-group. These RRs are relatively low compared with the RRs linked to comorbidities described in the literature, suggesting that, for a given exposure, the influence of comorbidity on the occurrence of invasive listeriosis is much greater than the influence of strain virulence. The updated model parameters allow better prediction of the risk of invasive listeriosis across a population of interest, provided the necessary data on population demographics and on the proportional contribution of strain virulence classes in the food products of interest are available. An R package is made available to facilitate the use of these dose-response models.
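
The schematic below illustrates, with made-up parameters, how a lognormal-Poisson-type dose-response can be marginalised over strain variability and how a relative risk between virulence classes can be computed; it is not the published EFSA model nor the accompanying R package.

```python
import numpy as np

rng = np.random.default_rng(42)

def p_illness(dose, mu_log10_r, sigma_log10_r, n_draws=100_000):
    """Marginal probability of invasive listeriosis for a given ingested dose,
    averaging the exponential dose-response P = 1 - exp(-r*dose) over a
    lognormally distributed single-cell probability r (illustrative only)."""
    log10_r = rng.normal(mu_log10_r, sigma_log10_r, n_draws)
    r = 10.0 ** log10_r
    return np.mean(1.0 - np.exp(-r * dose))

# Purely illustrative virulence-class parameters (NOT the published EFSA values)
classes = {"less virulent": (-16.0, 1.5), "more virulent": (-14.7, 1.5)}

dose = 1_000  # ingested cells
risks = {k: p_illness(dose, *v) for k, v in classes.items()}
rr = risks["more virulent"] / risks["less virulent"]
print(risks, f"RR (more vs. less virulent) = {rr:.1f}")
```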

4.
Foods ; 13(5)2024 Feb 26.
Article in English | MEDLINE | ID: mdl-38472829

ABSTRACT

Invasive listeriosis, owing to its severity in susceptible populations, has been the focus of many quantitative risk assessment (QRA) models aiming to provide a valuable guide for future risk management efforts. A review of the published QRA models of Listeria monocytogenes in seafood was performed, with the objective of appraising the effectiveness of control strategies at different points along the food chain. It is worth noting, however, that the outcomes of a QRA model are context-specific, being influenced by the country and target population, the assumptions employed, and the model architecture itself. Studies containing QRA models were retrieved through a literature search using properly connected keywords on Scopus and PubMed®. All 13 QRA models recovered were of narrow scope, covering, at most, the period from the end of processing to consumption; the majority (85%) focused on smoked or gravad fish. Since the modelled pathways commenced with the packaged product, none of the QRA models addressed cross-contamination events. Many models agreed that keeping the product's temperature at 4.0-4.5 °C leads to greater reductions in the final risk of listeriosis than reducing the shelf life by one week, and that the effectiveness of both measures can be surpassed by reducing the initial occurrence of L. monocytogenes in the product (at the end of processing). It is therefore necessary that future QRA models for RTE seafood contain a processing module that can provide insight into intervention strategies that retard L. monocytogenes growth, such as the use of bacteriocins, ad hoc starter cultures and/or organic acids, as well as strategies seeking to reduce cross-contamination at the facilities, such as stringent controls on sanitation procedures. Since risk estimates were shown to be moderately driven by growth kinetic parameters, namely the exponential growth rate, the minimum temperature for growth, and the maximum population density, further work is needed to reduce their uncertainties.
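
A minimal what-if sketch of the kind of scenario comparison described above, using an assumed square-root secondary model and assumed storage conditions (none of the values are taken from the reviewed seafood QRAs):

```python
# Assumed illustrative values; not parameters from the reviewed seafood QRA models
B, T_MIN, NMAX = 0.012, -2.0, 7.5  # sqrt slope, min growth temp (degC), max density (log10 CFU/g)
C0 = -2.0                          # level at the end of processing, log10 CFU/g (assumed)

def final_count(temp_c, shelf_life_d, c0=C0):
    """log10 CFU/g at consumption: exponential growth at constant temperature, capped at NMAX."""
    mu = (B * (temp_c - T_MIN)) ** 2        # growth rate, log10 CFU/g per hour
    return min(c0 + mu * 24 * shelf_life_d, NMAX)

print(f"baseline (7 degC, 28 d):           {final_count(7.0, 28):.1f} log10 CFU/g")
print(f"colder storage (4.5 degC, 28 d):   {final_count(4.5, 28):.1f} log10 CFU/g")
print(f"shorter shelf life (7 degC, 21 d): {final_count(7.0, 21):.1f} log10 CFU/g")
```

Under these assumed parameters the colder-storage scenario limits growth more than the shorter shelf life, mirroring the qualitative finding above; real QRA models propagate distributions for all inputs rather than point values.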

5.
Foods ; 13(3)2024 Jan 23.
Article in English | MEDLINE | ID: mdl-38338495

ABSTRACT

A review of the published quantitative risk assessment (QRA) models of L. monocytogenes in meat and meat products was performed, with the objective of appraising the intervention strategies deemed suitable for implementation along the food chain, as well as their relative effectiveness. A systematic review retrieved 23 QRA models; most of them (87%) focused on ready-to-eat meat products, and the majority (78%) covered short supply chains (end of processing/retail to consumption, or consumption only). The processing-to-table scope was the choice for models of processed meats such as chorizo, bulk-cooked meat, fermented sausage and dry-cured pork, in which the effects of processing were simulated. Sensitivity analysis demonstrated the importance of obtaining accurate estimates for lag time, growth rate and maximum microbial density, in particular when affected by growth inhibitors and lactic acid bacteria. In the case of deli meats, QRA models showed that delicatessen meats sliced at retail were associated with a higher risk of listeriosis than manufacturer pre-packed deli meats. Many models converged on the findings that (1) controlling cold storage temperature led to greater reductions in the final risk than decreasing the time to consumption and, furthermore, that (2) lower numbers and lower prevalence of L. monocytogenes at the end of processing were far more effective than keeping low temperatures and/or short times during retail and/or home storage. Therefore, future listeriosis QRA models for meat products should encompass a processing module in order to assess the intervention strategies that lead to lower numbers and prevalence, such as the use of bio-preservation and novel technologies. Future models should be built upon accurate microbial kinetic parameters and should realistically represent cross-contamination events along the food chain.
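
The kind of sensitivity analysis referred to above can be sketched with a simple Monte Carlo simulation and rank correlations; the distributions below are arbitrary placeholders, not inputs from any of the reviewed models.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 10_000

# Illustrative input distributions (assumptions, not values from the reviewed models)
lag_h  = rng.uniform(24, 120, n)       # lag time at storage temperature, h
mu     = rng.uniform(0.005, 0.03, n)   # growth rate, log10 CFU/g per h
nmax   = rng.uniform(6.5, 8.5, n)      # maximum microbial density, log10 CFU/g
time_h = rng.uniform(120, 600, n)      # storage time, h
c0     = rng.normal(-1.0, 0.5, n)      # initial concentration, log10 CFU/g

# Simple lag + exponential growth model, capped at nmax
final_log = np.minimum(c0 + mu * np.clip(time_h - lag_h, 0, None), nmax)

# Rank (Spearman) correlation of each input with the output
for name, x in [("lag time", lag_h), ("growth rate", mu), ("max density", nmax),
                ("storage time", time_h), ("initial level", c0)]:
    rho, _ = stats.spearmanr(x, final_log)
    print(f"{name:>14s}: Spearman rho = {rho:+.2f}")
```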

6.
Foods ; 12(24)2023 Dec 11.
Article in English | MEDLINE | ID: mdl-38137240

ABSTRACT

A review of the published quantitative risk assessment (QRA) models of L. monocytogenes in dairy products was undertaken in order to identify and appraise the relative effectiveness of control measures and intervention strategies implemented at the primary production, processing, retail, and consumer stages. A systematic literature search retrieved 18 QRA models; most of them (9) investigated raw- and pasteurized-milk cheeses, and the majority covered long supply chains (4 farm-to-table and 3 processing-to-table scopes). On-farm contamination sources, whether shedding animals or the broader environment, were shown by different QRA models to affect the risk of listeriosis, in particular for raw-milk cheeses. Through scenarios and sensitivity analysis, QRA models demonstrated the importance of the modeled growth rate and lag phase duration and showed that consumer practices contribute more to the final risk than retail conditions. Storage temperature was shown to be a stronger determinant of the final risk than storage time. Despite the pathogen's known ability to reside in damp spots or niches, re-contamination and/or cross-contamination were modeled in only two QRA studies. Future QRA models of dairy products should cover the full farm-to-table scope, should represent cross-contamination and the use of novel technologies, and should estimate L. monocytogenes growth more accurately by means of better-informed kinetic parameters and realistic time-temperature trajectories.

7.
Toxins (Basel) ; 15(7)2023 07 09.
Article in English | MEDLINE | ID: mdl-37505721

ABSTRACT

The present study aims to compare ochratoxin A (OTA) exposure through the intake of three cereal-derived products (bread, pasta and semolina) in two different Moroccan climatic regions (littoral and continental). Weekly OTA intakes from cereal products were calculated using a deterministic approach for each region. Results showed a statistically significant difference (p < 0.05) in OTA exposure between the two regions: the median OTA exposure was estimated at 48.97 ng/kg body weight (b.w.)/week in the littoral region versus 6.36 ng/kg b.w./week in the continental region. The probabilistic approach showed that, accounting for uncertainties, the 95th percentile of weekly OTA exposure associated with the three cereal products ranged from 66.18 to 137.79 ng/kg b.w./week (95% CI), with a median of 97.44 ng/kg b.w./week. Compared with the threshold of 100 ng/kg b.w./week, 95% of the cumulative distributions predicted an exceedance frequency between 0.42 and 17.30% (95% CI), with a median exceedance frequency of 4.43%. The results show that cereal derivatives constitute an important vector of OTA exposure and cause a significant exceedance of the toxicological reference value among high consumers in the littoral region, which suggests the urgency of reconsidering the maximum regulatory limit (MRL) of 3 µg/kg set for OTA in cereal derivatives by the Moroccan authorities.
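
A schematic of the probabilistic (Monte Carlo) exposure calculation used in this type of assessment is given below; the concentration, consumption and body-weight distributions are invented for illustration and do not reproduce the Moroccan survey data.

```python
import numpy as np

rng = np.random.default_rng(123)
n = 100_000

# Illustrative (assumed) input distributions for one region -- not the survey data
conc_ng_g = {                     # OTA concentration in each product, ng/g (µg/kg)
    "bread":    rng.lognormal(mean=np.log(0.4), sigma=0.8, size=n),
    "pasta":    rng.lognormal(mean=np.log(0.3), sigma=0.8, size=n),
    "semolina": rng.lognormal(mean=np.log(0.5), sigma=0.8, size=n),
}
intake_g_week = {                 # consumption, g/week (assumed)
    "bread":    rng.normal(1500, 300, n),
    "pasta":    rng.normal(350, 100, n),
    "semolina": rng.normal(250, 80, n),
}
body_weight_kg = rng.normal(70, 12, n)

# Weekly exposure (ng/kg b.w./week) summed over the three products
exposure = sum(conc_ng_g[p] * np.clip(intake_g_week[p], 0, None)
               for p in conc_ng_g) / np.clip(body_weight_kg, 30, None)

threshold = 100.0  # tolerable weekly intake used in the study, ng/kg b.w./week
print(f"median exposure  : {np.median(exposure):.1f} ng/kg b.w./week")
print(f"95th percentile  : {np.percentile(exposure, 95):.1f} ng/kg b.w./week")
print(f"% above threshold: {100 * np.mean(exposure > threshold):.2f}%")
```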


Subject(s)
Edible Grain; Ochratoxins; Edible Grain/chemistry; Food Contamination/analysis; Ochratoxins/analysis; Bread
8.
Sci Data ; 9(1): 654, 2022 10 26.
Article in English | MEDLINE | ID: mdl-36289246

ABSTRACT

SARS-CoV-2 (severe acute respiratory syndrome coronavirus 2), a virus causing severe acute respiratory disease in humans, emerged in late 2019. Like other coronaviruses, this respiratory virus can spread via aerosols, fomites, and contaminated hands or surfaces. Studying the persistence of these viruses under different environmental conditions is a key step towards better understanding virus transmission. This work presents a reproducible procedure for collecting stability and inactivation kinetics data from the scientific literature, with the aim of identifying data useful for characterizing the persistence of viruses in food production plants. As a result, a large dataset on persistence on matrices or in liquid media under different environmental conditions is presented. The procedure, combining a bibliographic survey, data digitalization techniques and predictive microbiological modelling, identified 65 research articles providing 455 coronavirus kinetics. A ranking step as well as a technical validation with a Gage Repeatability & Reproducibility process were performed to check the quality of the kinetics. All data were deposited in public repositories for future use by other researchers.
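
A minimal example of the kind of kinetic information extracted in this procedure: fitting a log-linear inactivation model to a digitised survival curve to obtain a D value (the data points below are hypothetical).

```python
import numpy as np

# Hypothetical digitised survival kinetics for one coronavirus/matrix/condition:
# time (hours) and infectious titre (log10 TCID50/mL) -- illustrative values only
time_h = np.array([0, 6, 12, 24, 48, 72])
log_titre = np.array([6.1, 5.6, 5.2, 4.1, 2.4, 0.9])

# Log-linear (first-order) inactivation: log10 N(t) = log10 N0 - t / D
slope, intercept = np.polyfit(time_h, log_titre, 1)
d_value_h = -1.0 / slope

print(f"Fitted D value: {d_value_h:.1f} h per log10 reduction")
print(f"Estimated initial titre: {intercept:.1f} log10 TCID50/mL")
```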


Subject(s)
COVID-19; SARS-CoV-2; Humans; Food Handling; Kinetics; Plants, Edible; Reproducibility of Results; Databases, Factual
9.
ALTEX ; 39(4): 667-693, 2022.
Article in English | MEDLINE | ID: mdl-36098377

ABSTRACT

Assessment of potential human health risks associated with environmental and other agents requires careful evaluation of all available and relevant evidence for the agent of interest, including both data-rich and data-poor agents. With the advent of new approach methodologies in toxicological risk assessment, guidance on integrating evidence from multiple evidence streams is needed to ensure that all available data is given due consideration in both qualitative and quantitative risk assessment. The present report summarizes the discussions among academic, government, and private sector participants from North America and Europe in an international workshop convened to explore the development of an evidence-based risk assessment framework, taking into account all available evidence in an appropriate manner in order to arrive at the best possible characterization of potential human health risks and associated uncertainty. Although consensus among workshop participants was not a specific goal, there was general agreement on the key considerations involved in evidence-based risk assessment incorporating 21st century science into human health risk assessment. These considerations have been embodied into an overarching prototype framework for evidence integration that will be explored in more depth in a follow-up meeting.


Subject(s)
Risk Assessment; Humans; Europe
10.
Biology (Basel) ; 11(1)2022 Jan 07.
Article in English | MEDLINE | ID: mdl-35053086

ABSTRACT

Food safety is a constant challenge for stakeholders in the food industry. To manage the likelihood of microbiological contamination, food safety management systems must be robust, including food and environmental testing. Environmental monitoring programs (EMPs) have emerged over the last decade, aiming to validate cleaning-sanitation procedures and other environmental pathogen control programs. The need to monitor production environments has become evident because of recent foodborne outbreaks. However, environmental monitoring is not limited to the management of pathogens; it also extends to spoilage and hygiene indicator microorganisms, allergens, and other hygiene monitoring. Surfaces in production environments can be a source of contamination, either through ineffective cleaning and disinfection procedures or through contamination during production by flows or operators. This study analyses the current practices of 37 French agri-food companies (small, medium, or large), reporting their EMP objectives, microbial targets, sampling types, numbers and frequencies, analysis of results, and types of corrective actions.

11.
Foods ; 9(11)2020 Nov 11.
Article in English | MEDLINE | ID: mdl-33187291

ABSTRACT

The foodborne disease burden (FBDB) related to 26 major biological hazards in France was attributed to foods and to poor food-handling practices at the final food preparation step, in order to develop effective intervention strategies, especially food safety campaigns. Campylobacter spp. and non-typhoidal Salmonella accounted for more than 60% of the FBDB. Approximately 30% of the FBDB was attributed to 11 other hazards, including bacteria, viruses and parasites. Meats were estimated to be the main contributing food category, causing 50-69% (CI90) of the FBDB, with poultry, pork and beef accounting for 33-44%, 9-21% and 4-20% (CI90) of the FBDB, respectively. Dairy products, eggs, raw produce and complex foods each caused approximately 5-20% (CI90) of the FBDB. For foods contaminated before the final preparation step, we estimated that inadequate cooking, cross-contamination and inadequate storage contribute 19-49%, 7-34% and 9-23% (CI90) of the FBDB, respectively; 15-33% (CI90) of the FBDB was attributed to the initial contamination of ready-to-eat foods, without any contribution from final food handlers. The thorough implementation of good hygienic practices (GHPs) at the final food preparation step could potentially reduce the FBDB by 67-85% (CI90), mainly through the prevention of cross-contamination and through adequate cooking and storage.
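
The CI90 ranges reported above are credible intervals obtained by propagating uncertainty through the attribution model; the toy Monte Carlo below shows how such interval estimates for attribution fractions can be produced, using an arbitrary Dirichlet distribution rather than the study's actual uncertainty model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Illustrative only: uncertain attribution fractions for the handling-related pathways,
# drawn from a Dirichlet distribution with made-up weights (not the study's method)
alpha = np.array([6.0, 3.5, 2.8, 4.2])
shares = rng.dirichlet(alpha, size=n)

labels = ["inadequate cooking", "cross-contamination",
          "inadequate storage", "initial contamination of RTE foods"]
for i, label in enumerate(labels):
    lo, med, hi = np.percentile(shares[:, i], [5, 50, 95])
    print(f"{label:<35s}: {100*med:.0f}% (CI90 {100*lo:.0f}-{100*hi:.0f}%)")
```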

12.
Foods ; 9(11)2020 Oct 24.
Article in English | MEDLINE | ID: mdl-33114308

ABSTRACT

Entomophagy has long been part of human diets in a significant part of the world, but insects are considered a novel food everywhere else. It appears to be a strategic alternative for the future of the human diet, helping to meet the challenge of ensuring food security for a growing world population while using production systems that are more environmentally sustainable than those required for rearing other animals. Tenebrio molitor, known as the yellow mealworm, is one of the most promising insect species for mass rearing, and can be processed into a powder that ensures a long shelf life for use in many potential products. When considering insects as food or feed, it is necessary to guarantee their safety. Manufacturers must therefore implement a Hazard Analysis and Critical Control Point (HACCP) plan to limit risks to consumer health. The aim of this case study was to develop a HACCP plan for Tenebrio molitor larvae powders for food, using a risk-based approach, to support its implementation in industry. Specific purposes were to identify the significant biological hazards involved and to assess the effectiveness of different manufacturing process steps when used as critical control points. Combinations of four different processes with four potential consumer uses of the powders (burger, protein shake, baby porridge, and biscuits) were then analyzed with regard to their safety.

13.
Appl Environ Microbiol ; 86(18)2020 09 01.
Article in English | MEDLINE | ID: mdl-32680860

ABSTRACT

Temperature and relative humidity are major factors determining virus inactivation in the environment. This article reviews inactivation data for coronaviruses on surfaces and in liquids from published studies and develops secondary models to predict coronavirus inactivation as a function of temperature and relative humidity. A total of 102 D values (i.e., the time needed to obtain a 1-log10 reduction in virus infectivity), including values for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), were collected from 26 published studies. The values obtained from the different coronaviruses and studies were found to be generally consistent. Five different models were fitted to the global dataset of D values; the most appropriate model considered both temperature and relative humidity. A spreadsheet predicting the inactivation of coronaviruses, together with the associated uncertainty, is presented and can be used to predict virus inactivation for untested temperatures, time points, or any coronavirus strain belonging to the Alphacoronavirus and Betacoronavirus genera. IMPORTANCE: The prediction of the persistence of SARS-CoV-2 on fomites is essential for investigating the importance of contact transmission. This study collects available information on the inactivation kinetics of coronaviruses in both solid and liquid fomites and creates a mathematical model of the impact of temperature and relative humidity on virus persistence. The predictions of the model can support more robust decision-making and could be useful in various public health contexts. A calculator for the natural clearance of SARS-CoV-2 depending on temperature and relative humidity could be a valuable operational tool for public authorities.
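
A schematic of a secondary model of this kind (D value decreasing log-linearly with temperature and relative humidity) is sketched below; the reference D value and z-type coefficients are assumptions for illustration, not the coefficients fitted in the article or implemented in the spreadsheet.

```python
# Schematic Bigelow-type secondary model: log10(D) decreases linearly with
# temperature and relative humidity. All parameter values below are assumptions.
LOG10_D_REF = 2.0           # log10 of the D value (h) at the reference condition (assumed)
T_REF, RH_REF = 20.0, 50.0  # reference temperature (degC) and relative humidity (%)
Z_T, Z_RH = 15.0, 80.0      # degC / %RH change needed for a 10-fold change in D (assumed)

def d_value_h(temp_c, rh_pct):
    """Predicted D value (hours per log10 reduction of infectivity)."""
    log10_d = LOG10_D_REF - (temp_c - T_REF) / Z_T - (rh_pct - RH_REF) / Z_RH
    return 10.0 ** log10_d

def log10_reduction(temp_c, rh_pct, hours):
    """Predicted log10 reduction of infectious titre after a given exposure time."""
    return hours / d_value_h(temp_c, rh_pct)

# Example: natural clearance on a surface after 24 h at 25 degC and 40% relative humidity
print(f"D = {d_value_h(25, 40):.0f} h; "
      f"reduction after 24 h = {log10_reduction(25, 40, 24):.2f} log10")
```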


Subject(s)
Betacoronavirus/physiology; Coronavirus Infections/virology; Models, Biological; Pneumonia, Viral/virology; Virus Inactivation; COVID-19; Fomites/virology; Humans; Humidity; Pandemics; Public Health; SARS-CoV-2; Suspensions; Temperature
14.
Front Microbiol ; 10: 2578, 2019.
Article in English | MEDLINE | ID: mdl-31798549

ABSTRACT

With increased interest in the source attribution of foodborne pathogens, there is a need to sort and assess the applicability of the currently available methods. Here, we reviewed the most frequently applied methods for source attribution of foodborne diseases, discussing their main strengths and weaknesses to be considered when choosing the most appropriate method on the basis of the type, quality, and quantity of data available, the research questions to be addressed, and the epidemiological and microbiological characteristics of the pathogens in question. A variety of source attribution approaches have been applied in recent years. These methods can be classified as top-down, bottom-up, or combined. Top-down approaches assign human cases back to their sources of infection based on epidemiological (e.g., outbreak data analysis, case-control/cohort studies), microbiological (i.e., microbial subtyping), or combined (e.g., the so-called 'source-assigned case-control study' design) methods. Methods based on microbial subtyping can be further differentiated according to the modeling framework adopted, namely frequency-matching models (e.g., the Dutch and Danish models) or population-genetics models (e.g., Asymmetric Island Models and STRUCTURE), relying on the modeling of either phenotyping or genotyping data of pathogen strains from human cases and putative sources. Conversely, bottom-up approaches such as comparative exposure assessment start from the level of contamination (prevalence and concentration) of a given pathogen in each source and then move up the transmission chain, incorporating factors related to human exposure to these sources and dose-response relationships. Other approaches include intervention studies, 'natural experiments,' and expert elicitations. A number of methodological challenges concerning all these approaches are discussed. In the absence of a universally agreed-upon 'gold standard', i.e., a single method that satisfies all situations and needs for all pathogens, combining different approaches or applying them in a comparative fashion seems to be a promising way forward.
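
To make the frequency-matching idea concrete, the toy example below shares human cases of each subtype among sources in proportion to the subtype's relative occurrence in each source (a simplification of the Dutch-type models, which additionally weight sources, e.g., by the amount of food consumed); all numbers are invented.

```python
import numpy as np

# Minimal frequency-matching sketch; all figures are made up for illustration.
subtypes = ["ST-A", "ST-B", "ST-C"]
sources  = ["poultry", "pork", "cattle"]

# relative occurrence p[i, j] of subtype i in source j (rows: subtypes)
p = np.array([[0.60, 0.10, 0.05],
              [0.20, 0.50, 0.10],
              [0.05, 0.15, 0.55]])

human_cases = np.array([120, 80, 40])   # observed cases per subtype (illustrative)

weights = p / p.sum(axis=1, keepdims=True)       # share of each subtype per source
attributed = (human_cases[:, None] * weights).sum(axis=0)

for source, cases in zip(sources, attributed):
    print(f"{source:>8s}: {cases:.0f} attributed cases "
          f"({100 * cases / human_cases.sum():.0f}%)")
```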

15.
PLoS One ; 14(2): e0213039, 2019.
Article in English | MEDLINE | ID: mdl-30818354

ABSTRACT

Food safety risk assessments and large-scale epidemiological investigations have the potential to provide better and new types of information when whole genome sequence (WGS) data are effectively integrated. Today, the NCBI Pathogen Detection database WGS collections have grown significantly through improvements in technology, coordination, and collaboration, such as the GenomeTrakr and PulseNet networks. However, high-quality genomic data are not often coupled with high-quality epidemiological or food chain metadata. We have created a set of tools for the cleaning, curation, integration, analysis and visualization of microbial genome sequencing data. It has been tested using Salmonella enterica and Listeria monocytogenes data sets provided by NCBI Pathogen Detection (160,000 sequenced isolates in 2018). GenomeGraphR presents foodborne pathogen WGS data and associated curated metadata in a user-friendly interface that allows a user to query a variety of research questions, such as transmission sources and dynamics, global reach, and persistence of genotypes associated with contamination in the food supply and foodborne illness across time or space. The application is freely available (https://fda-riskmodels.foodrisk.org/genomegraphr/).


Subject(s)
Food Microbiology; Food Safety; Foodborne Diseases/microbiology; Whole Genome Sequencing/statistics & numerical data; Databases, Genetic; Foodborne Diseases/epidemiology; Genome, Bacterial; Humans; Internet; Listeria monocytogenes/genetics; Listeria monocytogenes/isolation & purification; Listeriosis/epidemiology; Listeriosis/microbiology; Metadata; Molecular Epidemiology; Polymorphism, Single Nucleotide; Risk Assessment; Salmonella Food Poisoning/epidemiology; Salmonella Food Poisoning/microbiology; Salmonella enterica/genetics; Software; User-Computer Interface
17.
EFSA J ; 16(1): e05131, 2018 Jan.
Article in English | MEDLINE | ID: mdl-32625678

ABSTRACT

The qualified presumption of safety (QPS) concept was developed to provide a harmonised generic pre-evaluation to support safety risk assessments of biological agents performed by EFSA's scientific Panels. The identity, body of knowledge, safety concerns and antimicrobial resistance of valid taxonomic units were assessed. Safety concerns identified for a taxonomic unit are, where possible and reasonable in number, considered to be 'qualifications' which should be assessed at the strain level by EFSA's scientific Panels. No new information was found that would change the previously recommended QPS taxonomic units and their qualifications. The BIOHAZ Panel confirms that the QPS approach can be extended to a genetically modified production strain if the recipient strain qualifies for QPS status and if the genetic modification does not indicate a concern. Between April and September 2017, the QPS notification list was updated with 46 applications for market authorisation. Of these, 14 biological agents already had QPS status and 16 were not included as they are filamentous fungi or enterococci. One notification of Streptomyces K-61 (formerly notified as S. griseoviridis) and four of Escherichia coli were not considered for assessment, as they belong to taxonomic units that were excluded from further evaluations within the current QPS mandate. Eight notifications of Bacillus thuringiensis and one of an oomycete are pending receipt of the complete application. Two taxonomic units were evaluated: Kitasatospora paracochleata, which had not been evaluated before, and Komagataella phaffii (previously notified as Pichia pastoris), included due to a change in its taxonomic identity. Kitasatospora paracochleata cannot be granted QPS status due to a lack of information on its biology and its possible production of toxic secondary metabolites. The species Komagataella phaffii can be recommended for the QPS list when used for enzyme production.

18.
EFSA J ; 16(1): e05132, 2018 Jan.
Article in English | MEDLINE | ID: mdl-32625679

ABSTRACT

The European Commission asked EFSA for a scientific opinion on chronic wasting disease in two parts. Part one, on surveillance, animal health risk-based measures and public health risks, was published in January 2017. This opinion (part two) addresses the remaining Terms of Reference, namely, 'are the conclusions and recommendations in the EFSA opinion of June 2004 on diagnostic methods for chronic wasting disease still valid? If not, an update should be provided', and 'update the conclusions of the 2010 EFSA opinion on the results of the European Union survey on chronic wasting disease in cervids, as regards its occurrence in the cervid population in the European Union'. Data on the performance of authorised rapid tests in North America are not comprehensive, and are more limited than those available for the tests approved for statutory transmissible spongiform encephalopathies surveillance applications in cattle and sheep. There are no data directly comparing available rapid test performances in cervids. The experience in Norway shows that the Bio-Rad TeSeE™ SAP test, immunohistochemistry and western blotting have detected reindeer, moose and red deer cases. It was shown that testing both brainstem and lymphoid tissue from each animal increases the surveillance sensitivity. Shortcomings in the previous EU survey limited the reliability of inferences that could be made about the potential disease occurrence in Europe. Subsequently, testing activity in Europe was low, until the detection of the disease in Norway, triggering substantial testing efforts in that country. Available data neither support nor refute the conclusion that chronic wasting disease does not occur widely in the EU and do not preclude the possibility that the disease was present in Europe before the survey was conducted. It appears plausible that chronic wasting disease could have become established in Norway more than a decade ago.

19.
EFSA J ; 16(6): e05281, 2018 Jun.
Article in English | MEDLINE | ID: mdl-32625925

ABSTRACT

EFSA received an application from the Dutch Competent Authority, under Article 20 of Regulation (EC) No 1069/2009 and Regulation (EU) No 142/2011, for the evaluation of an alternative method for the treatment of Category 3 animal by-products (ABP). It consists of the hydrolysis of the material into short carbon chains, resulting in medium-chain fatty acids that may contain up to 1% hydrolysed protein, for use in animal feed. A physical process, with ultrafiltration followed by nanofiltration to remove hazards, is also used. Process efficacy has been evaluated based on the ability of the membrane barriers to retain the potential biological hazards present. Small viruses passing the ultrafiltration membrane will be retained at the nanofiltration step, which represents a Critical Control Point (CCP) in the process. This step requires the Applicant to validate and provide certification for the specific use of the nanofiltration membranes employed. Continuous monitoring and membrane integrity tests should be included as control measures in the HACCP plan. The ultrafiltration and nanofiltration techniques are able to remove particles of the size of viruses, bacteria and parasites from liquids. If used under controlled and appropriate conditions, the proposed processing methods should reduce the risk in the end product to a degree that is at least equivalent to that achieved with the processing standards laid down in the Regulation for Category 3 material. The possible presence of small bacterial toxins produced during the fermentation steps cannot be avoided by the nanofiltration step, and this hazard should be controlled by a CCP elsewhere in the process. The limitations specified in the current legislation, and any future modifications in relation to the end use of the product, also apply to this alternative process, and no hydrolysed protein of ruminant origin (except from ruminant hides and skins) can be included in feed for farmed animals or for aquaculture.

20.
EFSA J ; 16(7): e05314, 2018 Jul.
Article in English | MEDLINE | ID: mdl-32625957

ABSTRACT

EFSA was requested: to assess the impact of a proposed quantitative real-time polymerase chain reaction (qPCR) 'technical zero' on the limit of detection of official controls for constituents of ruminant origin in feed, to review and update the 2011 QRA, and to estimate the cattle bovine spongiform encephalopathy (BSE) risk posed by the contamination of feed with BSE-infected bovine-derived processed animal protein (PAP), should pig PAP be re-authorised in poultry feed and vice versa, using both light microscopy and ruminant qPCR methods, and action limits of 100, 150, 200, 250 and 300 DNA copies. The current qPCR cannot discriminate between legitimately added bovine material and unauthorised contamination, or determine if any detected ruminant material is associated with BSE infectivity. The sensitivity of the surveillance for the detection of material of ruminant origin in feed is currently limited due to the heterogeneous distribution of the material, practicalities of sampling and test performance. A 'technical zero' will further reduce it. The updated model estimated a total BSE infectivity four times lower than that estimated in 2011, with less than one new case of BSE expected to arise each year. In the hypothetical scenario of a whole carcass of an infected cow entering the feed chain without any removal of specified risk material (SRM) or reduction of BSE infectivity via rendering, up to four new cases of BSE could be expected at the upper 95th percentile. A second model estimated that at least half of the feed containing material of ruminant origin will not be detected or removed from the feed chain, if an interpretation cut-off point of 100 DNA copies or more is applied. If the probability of a contaminated feed sample increased to 5%, with an interpretation cut-off point of 300 DNA copies, there would be a fourfold increase in the proportion of all produced feed that is contaminated but not detected.
